Equivalently, RLR applies a constraint to the weights:

$\|\mathbf{w}\|^2 \le s$   (4.39)

Under this constraint, when the regression coefficient of one variable increases, the regression coefficients of the other variables decrease. For example, suppose there are two variables in an RLR model, so there are two regression coefficients, $w_1$ and $w_2$. Because $w_1^2 + w_2^2 \le s$, if $w_1$ increases, $w_2$ must decrease (at least when the constraint is active, so that the weights lie on the boundary $w_1^2 + w_2^2 = s$). Suppose $w_1$ is increased by $\delta$. Because $w_1^2 + w_2^2 \le s$, the following equation shows the relationship between $w_2^{old}$ and $w_2^{new}$, i.e., how $w_2$ decreases when $w_1$ increases:

$w_2^{new} \le \sqrt{s - (w_1 + \delta)^2} < \sqrt{s - w_1^2} = w_2^{old}$   (4.40)
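As a quick numerical check of inequality (4.40), the following sketch in plain NumPy uses illustrative values $s = 1$, $w_1 = 0.6$, and $\delta = 0.1$ (chosen here for demonstration, not taken from the text) to confirm that increasing $w_1$ lowers the largest admissible value of $w_2$.

```python
import numpy as np

# Illustrative values (assumed for this example, not from the text):
# constraint budget s, current weight w1, and the increment delta.
s, w1, delta = 1.0, 0.6, 0.1

w2_old = np.sqrt(s - w1**2)            # largest w2 allowed before w1 grows
w2_new = np.sqrt(s - (w1 + delta)**2)  # largest w2 allowed after w1 grows by delta

print(f"w2_old = {w2_old:.4f}")  # 0.8000
print(f"w2_new = {w2_new:.4f}")  # 0.7141, strictly smaller, as (4.40) predicts
```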

Figure 4.18 shows an example of how the regression coefficients are constrained. When the regression coefficient vector moves from point A to point B, the magnitudes of the two regression coefficients change. Because the sum of the squares of the two regression coefficients is bounded, when $w_1$ increases, $w_2$ decreases accordingly.

Figure 4.18 The working principle of RLR. When the weight vector moves from A to B, the magnitudes of the two weights change. It is guaranteed that $w_2^B < w_2^A$ when $w_1^B > w_1^A$.

In an RLR model, $\lambda$ is a parameter that trades off between the model fit and the model complexity, or the model parsimoniousness. This is implemented in an iterated parameter shrinkage process. When $\lambda = 0$, an RLR model reduces to an ordinary, unregularized linear regression model.
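To make the role of $\lambda$ concrete, here is a minimal sketch (using NumPy and synthetic data invented for this example) that solves ridge regression in closed form, $\mathbf{w} = (X^\top X + \lambda I)^{-1} X^\top y$, rather than through the iterative shrinkage procedure the text describes. As $\lambda$ grows, $\|\mathbf{w}\|$ shrinks toward zero, and $\lambda = 0$ recovers ordinary least squares.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 2))                 # two predictors
y = 3.0 * X[:, 0] + 2.0 * X[:, 1] + rng.normal(scale=0.5, size=100)

def ridge(X, y, lam):
    """Closed-form ridge solution: w = (X^T X + lam * I)^(-1) X^T y."""
    d = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(d), X.T @ y)

for lam in [0.0, 1.0, 10.0, 100.0]:
    w = ridge(X, y, lam)
    print(f"lambda = {lam:6.1f}   w = {np.round(w, 3)}   ||w|| = {np.linalg.norm(w):.3f}")
# As lambda increases, ||w|| shrinks toward zero;
# lambda = 0 recovers the ordinary least-squares fit.
```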